Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study
Abstract
The Hopfield model is a pioneering neural-network model of associative memory retrieval. Its analytical solution in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size; beyond this threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural networks has been developed further toward more realistic networks using analog neurons, spiking neurons, and so on. Nevertheless, these advances are based on fully connected networks, which are inconsistent with the recent experimental finding that the number of connections per neuron is heterogeneous, following a heavy-tailed distribution. Motivated by this observation, we consider the Hopfield model on scale-free networks and obtain a pattern of associative memory retrieval different from that on the fully connected network: the storage capacity is enhanced tremendously, but at the cost of some error in memory retrieval, which appears as the heterogeneity of the connections increases. Moreover, the error rates obtained on several real neural networks are indeed similar to those on scale-free model networks.
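The setup described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' method: it assumes a simple Barabási-Albert-style preferential-attachment graph as the scale-free network, standard Hebbian couplings masked by the adjacency matrix, and synchronous sign updates for retrieval; all parameter values are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def ba_adjacency(n, m):
    """Barabasi-Albert-style preferential attachment (illustrative only)."""
    A = np.zeros((n, n), dtype=int)
    repeated = []                      # node list weighted by degree
    targets = list(range(m))           # seed: first m nodes
    for v in range(m, n):
        for t in set(targets):
            A[v, t] = A[t, v] = 1
        repeated.extend(targets)
        repeated.extend([v] * m)
        targets = list(rng.choice(repeated, size=m))
    return A

n, m, p = 200, 12, 2                   # network size, attachment links, patterns
A = ba_adjacency(n, m)
patterns = rng.choice([-1, 1], size=(p, n))

# Hebbian couplings restricted to the edges of the scale-free graph
W = (patterns.T @ patterns).astype(float) * A
np.fill_diagonal(W, 0)

# Retrieval: start from pattern 0 with 10% of the spins flipped
state = patterns[0].copy()
flipped = rng.choice(n, size=n // 10, replace=False)
state[flipped] *= -1
for _ in range(20):                    # synchronous sign updates
    h = W @ state
    state = np.where(h != 0, np.sign(h).astype(int), state)

overlap = state @ patterns[0] / n      # 1.0 = perfect recall
error_rate = (1 - overlap) / 2         # fraction of wrong spins
```

On a diluted network the effective load is set by the mean degree rather than by N, so increasing the number of stored patterns `p` relative to the typical degree quickly introduces the retrieval errors the abstract refers to.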
Related works
On the Maximum Storage Capacity of the Hopfield Model
Recurrent neural networks (RNNs) have traditionally been of great interest for their capacity to store memories. In past years, several works have been devoted to determining the maximum storage capacity of RNNs, especially for the Hopfield network, the most popular kind of RNN. Analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfi...
On the Storage Capacity of an Abstract Cortical Model with Silent Hypercolumns
In this report we investigate the storage capacity of an abstract generic attractor neural network model of the mammalian cortex. This model network has a diluted connection matrix and a fixed activity level that is independent of network size. We develop an analytical model of the storage capacity for this type of network when it is trained with both the Willshaw and Hopfield learning rule...
Information retrieval in neural networks. I. Eigenproblems in neural networks
Consideration of the eigenproblem of the synaptic matrix in Hopfield's model of neural networks suggests introducing a matrix built from an orthogonal set, orthogonal to the original memories. With this new scheme, the storage capacity is significantly enhanced and robustness is at least conserved. Revue Phys. Appl. 22 (1987) 1321-1325.
Effects of noise in training patterns on the memory capacity of the fully connected binary Hopfield neural network: mean-field theory and simulations
We show that the memory capacity of the fully connected binary Hopfield network is significantly reduced by a small amount of noise in the training patterns. Our analytical results, obtained with the mean-field method, are supported by extensive computer simulations.
A High-Storage Capacity Content-Addressable Memory and Its Learning Algorithm
Abstract - Hopfield's neural networks show retrieval and speed capabilities that make them good candidates for content-addressable memories (CAMs) in problems such as pattern recognition and optimization. This paper presents a new implementation of a VLSI fully interconnected neural network with only two binary memory points per synapse (the connection weights are restricted to three different v...